Big Data and Society; 8(2), 2021.
Article in English | Scopus | ID: covidwho-1448153

ABSTRACT

The spreading of COVID-19 misinformation on social media could have severe consequences for people's behavior. In this paper, we investigated the emotional expression of misinformation related to the COVID-19 crisis on Twitter and whether emotional valence differed depending on the type of misinformation. We collected 17,463,220 English tweets containing 76 COVID-19-related hashtags for March 2020. Using the Google Fact Check Explorer API, we identified 226 unique COVID-19 false stories for March 2020. These were clustered into six types of misinformation (cures, virus, vaccine, politics, conspiracy theories, and other). Applying the 226 classifiers to the Twitter sample, we identified 690,004 tweets. Instead of running the sentiment analysis on all tweets, we manually coded a random subset of 100 tweets for each classifier to increase validity, reducing the dataset to 2,097 tweets. We found that only a small part of the entire dataset was related to misinformation. Moreover, misinformation in general did not lean towards a particular emotional valence. However, comparing emotional valence across types of misinformation revealed that misinformation related to “virus” and “conspiracy” had a more negative valence than “cures,” “vaccine,” “politics,” and “other.” Since existing studies show that negative misinformation spreads faster, this demonstrates that filtering by misinformation type is fruitful and indicates that a focus on “virus” and “conspiracy” could be one strategy for combating misinformation. As emotional contexts affect the spread of misinformation, knowledge about the emotional valence of different types of misinformation will help to better understand the spreading and consequences of misinformation. © The Author(s) 2021.
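The abstract describes aggregating emotional valence scores over manually coded misinformation types and then comparing the types. The following is a minimal sketch of what that aggregation step could look like, assuming a lexicon-based sentiment scorer (VADER via NLTK) as a stand-in for whatever valence measure the authors actually used; the sample tweets and type labels are purely illustrative, not data from the study.

# A minimal sketch (illustrative data; VADER assumed as the sentiment scorer).
# Requires: pip install nltk, then nltk.download("vader_lexicon").
from collections import defaultdict
from nltk.sentiment.vader import SentimentIntensityAnalyzer

# Hypothetical hand-coded sample: (tweet text, misinformation type).
coded_tweets = [
    ("5G towers are spreading the virus", "conspiracy"),
    ("Drinking hot water cures COVID-19", "cures"),
    ("Masks are a secret government control experiment", "politics"),
]

analyzer = SentimentIntensityAnalyzer()
valence_by_type = defaultdict(list)

for text, misinfo_type in coded_tweets:
    # VADER's compound score ranges from -1 (most negative) to +1 (most positive).
    valence_by_type[misinfo_type].append(analyzer.polarity_scores(text)["compound"])

# Mean valence per misinformation type, mirroring the paper's comparison of
# "virus"/"conspiracy" against "cures", "vaccine", "politics", and "other".
for misinfo_type, scores in sorted(valence_by_type.items()):
    print(f"{misinfo_type}: mean valence = {sum(scores) / len(scores):.3f}")

In the study itself, the per-type comparison rests on the 2,097 manually validated tweets; the sketch only illustrates the shape of the scoring and aggregation step, not the authors' pipeline.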
